In information theory, given an unknown stationary source π with alphabet A and a sample w drawn from π, the Krichevsky–Trofimov (KT) estimator produces an estimate πi(w) of the probability of each symbol i ∈ A. This estimator is optimal in the sense that it minimizes the worst-case regret.
For a binary alphabet, and a string w with m zeroes and n ones, the KT estimator can be defined recursively[1] as:
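As a concrete sketch of the binary case, the following assumes the standard add-1/2 recursion commonly attributed to the KT estimator: P(empty) = 1, P(w·0) = P(w)·(m + 1/2)/(m + n + 1), and P(w·1) = P(w)·(n + 1/2)/(m + n + 1), where w contains m zeroes and n ones. The function name and use of exact rational arithmetic are choices of this sketch, not part of the original definition:

```python
from fractions import Fraction

def kt_probability(w):
    """Sequential KT probability of a binary string w, e.g. "0110".

    Assumes the add-1/2 recursion:
      P(empty) = 1
      P(w.0)   = P(w) * (m + 1/2) / (m + n + 1)
      P(w.1)   = P(w) * (n + 1/2) / (m + n + 1)
    where the prefix w seen so far has m zeroes and n ones.
    """
    p = Fraction(1)   # P(empty) = 1
    m = n = 0         # counts of zeroes and ones seen so far
    for bit in w:
        if bit == "0":
            # (m + 1/2) / (m + n + 1), written with integer numerator/denominator
            p *= Fraction(2 * m + 1, 2 * (m + n + 1))
            m += 1
        else:
            p *= Fraction(2 * n + 1, 2 * (m + n + 1))
            n += 1
    return p
```

For example, kt_probability("0") gives 1/2, and the estimate depends only on the counts m and n, not on the order of the symbols, so kt_probability("0011") equals kt_probability("1100").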